Neural Headline Generation on Abstract Meaning Representation
Authors
Abstract
Neural network-based encoder-decoder models are among the most attractive recent methodologies for tackling natural language generation tasks. This paper investigates the usefulness of structural syntactic and semantic information incorporated into a baseline neural attention-based model. We encode the output of an abstract meaning representation (AMR) parser using a modified version of Tree-LSTM. Our proposed attention-based AMR encoder-decoder model improves results on headline generation benchmarks over the baseline neural attention-based model.
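The AMR encoder described above builds on Tree-LSTM, which generalizes the sequential LSTM to tree-structured inputs by summing over a node's children. The following is a minimal sketch of a child-sum Tree-LSTM cell (Tai et al., 2015) of the kind such an encoder would modify; the class name, dimensions, and random weights are illustrative assumptions, not the paper's implementation.

```python
# Sketch of a child-sum Tree-LSTM cell; weights are random, not trained.
import numpy as np

rng = np.random.default_rng(0)
D = 4  # hidden size (illustrative)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ChildSumTreeLSTM:
    def __init__(self, dim):
        self.dim = dim
        # one (input, hidden) weight pair per gate: input i, forget f, output o, update u
        self.W = {g: rng.standard_normal((dim, dim)) * 0.1 for g in "ifou"}
        self.U = {g: rng.standard_normal((dim, dim)) * 0.1 for g in "ifou"}

    def node(self, x, children):
        """x: input vector for this node; children: list of (h, c) pairs."""
        h_sum = sum((h for h, _ in children), np.zeros(self.dim))
        i = sigmoid(self.W["i"] @ x + self.U["i"] @ h_sum)
        o = sigmoid(self.W["o"] @ x + self.U["o"] @ h_sum)
        u = np.tanh(self.W["u"] @ x + self.U["u"] @ h_sum)
        c = i * u
        # a separate forget gate per child, conditioned on that child's own hidden state
        for h_k, c_k in children:
            f_k = sigmoid(self.W["f"] @ x + self.U["f"] @ h_k)
            c = c + f_k * c_k
        h = o * np.tanh(c)
        return h, c

# Encode a tiny tree: a root node with two leaf children,
# bottom-up as an AMR graph traversal would proceed.
enc = ChildSumTreeLSTM(D)
leaf1 = enc.node(rng.standard_normal(D), [])
leaf2 = enc.node(rng.standard_normal(D), [])
root_h, root_c = enc.node(rng.standard_normal(D), [leaf1, leaf2])
print(root_h.shape)
```

In an attention-based encoder-decoder, the hidden states computed at each tree node would serve as the memory the decoder attends over when generating the headline.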
Similar papers
Conceptual Multi-layer Neural Network Model for Headline Generation
Neural attention-based models have recently been widely used in headline generation, mapping a source document to a target headline. However, traditional neural headline generation models use only the first sentence of the document as training input, ignoring the impact of document-level concept information on headline generation. In this work, a new neural attention-based model called ...
From Neural Sentence Summarization to Headline Generation: A Coarse-to-Fine Approach
Headline generation is an abstractive text summarization task that previously suffered from the immaturity of natural language generation techniques. The recent success of neural sentence summarization models shows their capacity for generating informative, fluent headlines conditioned on selected recapitulative sentences. In this paper, we investigate the extension of sentence summarization models t...
Low-Resource Neural Headline Generation
Recent neural headline generation models have shown great results, but are generally trained on very large datasets. We focus our efforts on improving headline quality on smaller datasets by means of pretraining. We propose new methods that enable pretraining all the parameters of the model and utilizing all available text, resulting in improvements of up to 32.4% relative in perplexity and ...
Abstract Meaning Representation Parsing using LSTM Recurrent Neural Networks
We present a system which parses sentences into Abstract Meaning Representations, improving state-of-the-art results for this task by more than 5%. AMR graphs represent semantic content using linguistic properties such as semantic roles, coreference, negation, and more. The AMR parser does not rely on a syntactic preparse, or heavily engineered features, and uses five recurrent neural networks ...
Reversing F-structure Rewriting for Generation from Meaning Representations
We describe the design of an LFG-based generation system that provides a framework for empirical studies on the choice among grammatical paraphrases (i.e. syntactic alternations), as an effect of interacting soft constraints. To be able to study the relevant variation, we extend the XLE generation architecture so it no longer departs from standard f-structures, but from a more abstract level of...